
Search results for: "Sharon Zhou"


6 mentions found


Mark Zuckerberg just dropped Meta's new AI models. But what Llama 3 does not do is beat OpenAI's GPT-4. Perhaps the most important detail was that Meta's open-source models will soon be on par with their closed-source counterparts. For open-source AI developers, that'll be a huge deal. The AI models they were working on looked pretty rudimentary as recently as last year as they struggled to complete sentences without repeating themselves.
Persons: Mark Zuckerberg, Nick Clegg, Sharon Zhou, Sam Altman, Justin Sullivan, Jim Fan, Gary Marcus Organizations: OpenAI, Meta, Google, Nvidia, Facebook, New York University
The Santa Clara giant's chips, known as GPUs, became the hottest property of the generative AI boom. In April last year, Zhou and her cofounder Greg Diamos, based in Palo Alto, brought their new startup, Lamini AI, out of stealth. It makes using AI models with GPUs, like the H100 and Nvidia's new Blackwell chip, as simple as a plug-and-play system. Fortunately for them, according to Zhou, AMD was on its way to building a rival system, after consulting with Diamos, that they would eventually test. "It's indiscernible to customers to run Lamini on Nvidia and AMD GPUs," she explained.
Persons: Mark Zuckerberg, Sam Altman, Jensen Huang, Sharon Zhou, Andrew Ng, Greg Diamos, Lisa Su Organizations: Nvidia, Harvard, Stanford, OpenAI, Anthropic, Amazon, AMD Locations: Santa Clara, Palo Alto
The race to build AI as smart as humans, or AGI, looks like it suffered a major blow. Google researchers found the transformer technology behind AI isn't very good at generalizing. Google researchers may have just given a major reality check to the ambitions of CEOs in pursuit of AI's holy grail. As it stands, AI is pretty good at specific tasks but less great at transferring skills across domains like humans do. Transformers' opacity and the scale of the data they're pretrained on gave some the illusion that they generalize beyond it.
Persons: Steve Yadlowsky, Nilesh Tripuraneni, Pedro Domingos, Satya Nadella, Sam Altman, Arvind Narayanan, Jim Fan, Sharon Zhou Organizations: University of Washington, Microsoft, Nvidia, Google Locations: Princeton
There's a new stack of hardware, software, tools, and services that will power AI applications for years to come. Cloud 2.0: Another key point here is that most AI developers already know how to use CUDA and Nvidia GPUs. Arguably, Nvidia has already created an AI cloud platform, as AWS once did for the Cloud 1.0 era. James Hamilton is an AWS cloud infrastructure genius who can take on Nvidia, even if the chipmaker has a major head start. Her startup spent months building a data center from scratch to help customers train AI models.
Persons: Jensen Huang, Rick Wilking, Andrew Ng, Michael Douglas, Luis Ceze, Andy Jassy, Adam Selipsky, James Hamilton, Oren Etzioni, Dario Amodei, Noah Berger, Sharon Zhou Organizations: Amazon Web Services, Nvidia, Home Depot, VMware, Madrona Venture, Annapurna Labs, Intel, AMD, Anthropic, Bernstein Locations: San Francisco, Seattle
The company didn't disclose what training data was used to train Llama 2, though the AI industry typically shares many details of AI training data sets. One way to avoid the issue is to just not tell anyone what data you used to train your AI model. Until now, the AI industry has been open about the training data used for models. That last data set made up more than two-thirds of the information Meta used to train LLaMA.
Persons: Rupert Murdoch, Sarah Silverman, Halimah DeLaine Prado, Sharon Zhou Organizations: Meta, OpenAI, Wall Street Journal, Big Tech, Microsoft, SEC, European Union, Google Locations: EU
Lately, the giant AI model has become faster, but its performance has declined. The world's most-powerful AI model has become, well, less powerful. It's considered the most powerful AI model broadly available, and it is multimodal, meaning it can understand images as well as text inputs. They think OpenAI is creating several smaller GPT-4 models that act similarly to the large model but are less expensive to run. This week, several AI experts posted what they claimed were details of GPT-4's architecture on Twitter.
Persons: Peter Yang, Frazier MacLeod, Christi Kennedy, Sharon Zhou, George Hotz, Soumith Chintala, Oren Etzioni, Greg Brockman, Lilian Weng Organizations: OpenAI, Twitter, Roblox, Microsoft, Meta, Allen Institute for AI, Semianalysis
Total: 6